
    Practical Lattice Cryptosystems: NTRUEncrypt and NTRUMLS

    Public key cryptography, as deployed on the internet today, stands on shaky ground. For over twenty years now it has been known that the systems in widespread use are insecure against adversaries equipped with quantum computers -- a fact that has largely been discounted due to the enormous challenge of building such devices. However, research into the development of quantum computers is accelerating and is producing an abundance of positive results that indicate quantum computers could be built in the near future. As a result, individuals, corporations and government entities are calling for the deployment of new cryptography to replace systems that are vulnerable to quantum cryptanalysis. Few satisfying schemes are to be found. This work examines the design, parameter selection, and cryptanalysis of a post-quantum public key encryption scheme, NTRUEncrypt, and a related signature scheme, NTRUMLS. It is hoped that this analysis will prove useful in comparing these schemes against other candidates that have been proposed to replace existing infrastructure.

    An upper bound on the decryption failure rate of static-key NewHope

    We give a new proof that the decryption failure rate of NewHope512 is at most $2^{-398.8}$. As in previous work, this failure rate is with respect to random, honestly generated, secret key and ciphertext pairs. However, our technique can also be applied to a fixed secret key. We demonstrate our technique on some subsets of the NewHope1024 key space, and we identify a large subset of NewHope1024 keys with failure rates of no more than $2^{-439.5}$.
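
    Bounds of this kind are typically obtained by computing the distribution of the decryption noise term and summing its tail mass beyond the decoding threshold. As a rough illustration of that style of calculation (not the paper's actual model of the NewHope noise term), the Python sketch below convolves toy error distributions, sums the tail beyond a q/4 threshold, and applies a union bound over coefficients; the values of k, n, and q are illustrative placeholders, not NewHope parameters.

        import numpy as np
        from math import comb

        def centered_binomial_pmf(k):
            # P[X - Y = s] for X, Y ~ Binomial(k, 1/2): C(2k, k+s) / 4^k, for s in [-k, k].
            return np.array([comb(2 * k, k + s) for s in range(-k, k + 1)], dtype=float) / 4**k

        def product_pmf(pmf, k):
            # Distribution of the product of two independent centered-binomial samples.
            out = np.zeros(2 * k * k + 1)
            for a in range(-k, k + 1):
                for b in range(-k, k + 1):
                    out[a * b + k * k] += pmf[a + k] * pmf[b + k]
            return out

        def convolution_power(pmf, n):
            # n-fold convolution of a PMF with itself, by square-and-multiply.
            result, base = np.array([1.0]), pmf.copy()
            while n:
                if n & 1:
                    result = np.convolve(result, base)
                n >>= 1
                if n:
                    base = np.convolve(base, base)
            return result

        k, n, q = 2, 64, 257               # toy placeholders, NOT NewHope's parameters
        e = centered_binomial_pmf(k)
        noise = convolution_power(product_pmf(e, k), n)   # crude model: sum of n small products
        center, threshold = (len(noise) - 1) // 2, q // 4
        tail = noise[center + threshold + 1:].sum() + noise[:center - threshold].sum()
        print(f"per-coefficient failure probability <= {tail:.3e}")
        print(f"union bound over {n} coefficients:    <= {n * tail:.3e}")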

    Improving post-quantum cryptography through cryptanalysis

    Large quantum computers pose a threat to our public-key cryptographic infrastructure. The possible responses are: Do nothing; accept the fact that quantum computers might be used to break widely deployed protocols. Mitigate the threat by switching entirely to symmetric-key protocols. Mitigate the threat by switching to different public-key protocols. Each user of public-key cryptography will make one of these choices, and we should not expect consensus. Some users will do nothing---perhaps because they view the threat as being too remote. And some users will find that they never needed public-key cryptography in the first place. The work that I present here is for people who need public-key cryptography and want to switch to new protocols. Each of the three articles raises the security estimate of a cryptosystem by showing that some attack is less effective than was previously believed. Each article thereby reduces the cost of using a protocol by letting the user choose smaller (or more efficient) parameters at a fixed level of security. In Part 1, I present joint work with Samuel Jaques in which we revise security estimates for the Supersingular Isogeny Key Exchange (SIKE) protocol. We show that known quantum claw-finding algorithms do not outperform classical claw-finding algorithms. This allows us to recommend 434-bit primes for use in SIKE at the security level for which 503-bit primes had previously been recommended. In Part 2, I present joint work with Martin Albrecht, Vlad Gheorghiu, and Eamonn Postlethwaite that examines the impact of quantum search on sieving algorithms for the shortest vector problem. Cryptographers commonly assume that the cost of solving the shortest vector problem in dimension $d$ is $2^{(0.265\ldots + o(1))d}$ quantumly and $2^{(0.292\ldots + o(1))d}$ classically. These are upper bounds based on a near neighbor search algorithm due to Becker--Ducas--Gama--Laarhoven. Naively, one might think that $d$ must be at least $483$ ($\approx 128/0.265$) to avoid attacks that cost fewer than $2^{128}$ operations. Our analysis accounts for terms in the $o(1)$ that were previously ignored. In a realistic model of quantum computation, we find that applying the Becker--Ducas--Gama--Laarhoven algorithm in dimension $d > 376$ will cost more than $2^{128}$ operations. We also find reason to believe that the classical algorithm will outperform the quantum algorithm in dimensions $d < 288$. In Part 3, I present solo work on a variant of post-quantum RSA. The original pqRSA proposal by Bernstein--Heninger--Lou--Valenta uses terabyte keys of the form $n = p_1 p_2 p_3 p_4 \cdots p_i \cdots p_{2^{31}}$, where each $p_i$ is a $4096$-bit prime. My variant uses terabyte keys of the form $n = p_1^2 p_2^3 p_3^5 p_4^7 \cdots p_i^{\pi_i} \cdots p_{20044}^{225287}$, where each $p_i$ is a $4096$-bit prime and $\pi_i$ is the $i$-th prime. Prime generation is the most expensive part of post-quantum RSA in practice, so the smaller number of prime factors in my proposal gives a large speedup in key generation. The repeated factors help an attacker identify an element of small order, and thereby allow the attacker to use a small-order variant of Shor's algorithm. I analyze small-order attacks and discuss the cost of the classical pre-computation that they require.
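
    To make the dimension figures in Part 2 concrete: the naive estimate simply divides the target security level by the asymptotic exponent, and the abstract's point is that the ignored $o(1)$ terms move the crossover substantially. The short computation below uses only the constants quoted above; the 376 and 288 figures are the paper's revised results, shown in comments for comparison.

        # Naive dimension estimates from the asymptotic sieve exponents quoted above.
        target_bits = 128
        quantum_exponent = 0.265    # log2(cost)/d for the quantum sieve (asymptotic)
        classical_exponent = 0.292  # log2(cost)/d for the classical sieve (asymptotic)

        naive_quantum_dim = target_bits / quantum_exponent      # ~483
        naive_classical_dim = target_bits / classical_exponent  # ~438

        print(f"naive quantum estimate:   d >= {naive_quantum_dim:.0f}")
        print(f"naive classical estimate: d >= {naive_classical_dim:.0f}")
        # The refined analysis instead places the 2^128 crossover at d > 376 for the
        # quantum sieve, and suggests no quantum advantage at all below d < 288.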

    A Comparison of NTRU Variants

    We analyze the size vs. security trade-offs that are available when selecting parameters for perfectly correct key encapsulation mechanisms based on NTRU.

    Multi-power Post-quantum RSA

    Special purpose factoring algorithms have discouraged the adoption of multi-power RSA, even in a post-quantum setting. We revisit the known attacks and find that a general recommendation against repeated factors is unwarranted. We find that one-terabyte RSA keys of the form $n = p_1^2 p_2^3 p_3^5 p_4^7 \cdots p_i^{\pi_i} \cdots p_{20044}^{225287}$ are competitive with one-terabyte RSA keys of the form $n = p_1 p_2 p_3 p_4 \cdots p_i \cdots p_{2^{31}}$. Prime generation can be made a factor of 100000 faster at a loss of at least $1$ but not more than $17$ bits of security against known attacks. The range depends on the relative cost of bit and qubit operations, under the assumption that qubit operations cost $2^c$ bit operations for some constant $c$.
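
    As a sanity check on the key-size and speedup figures above, the sketch below recomputes the approximate bit length of both key shapes and the ratio of primes that must be generated, using sympy to enumerate the first 20044 primes. It treats the bit length of $p_i^{\pi_i}$ as $\pi_i \cdot 4096$, which is accurate up to rounding.

        from math import log2
        from sympy import prime, primerange

        NUM_PRIMES = 20044
        largest = prime(NUM_PRIMES)               # the abstract's largest exponent, 225287
        primes = list(primerange(2, largest + 1))
        assert len(primes) == NUM_PRIMES

        multi_power_bits = 4096 * sum(primes)     # each p_i^{pi_i} contributes ~ pi_i * 4096 bits
        original_bits = 4096 * 2**31              # 2^31 distinct 4096-bit primes

        print(f"pi_{NUM_PRIMES} = {largest}")
        print(f"multi-power key: ~2^{log2(multi_power_bits):.2f} bits (~{multi_power_bits / 8 / 2**40:.2f} TiB)")
        print(f"original pqRSA:  ~2^{log2(original_bits):.2f} bits (~{original_bits / 8 / 2**40:.2f} TiB)")
        print(f"primes to generate: {NUM_PRIMES:,} vs {2**31:,}")
        print(f"prime-generation speedup: ~{2**31 / NUM_PRIMES:,.0f}x")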

    Estimating the cost of generic quantum pre-image attacks on SHA-2 and SHA-3

    We investigate the cost of Grover's quantum search algorithm when used in the context of pre-image attacks on the SHA-2 and SHA-3 families of hash functions. Our cost model assumes that the attack is run on a surface code based fault-tolerant quantum computer. Our estimates rely on a time-area metric that costs the number of logical qubits times the depth of the circuit in units of surface code cycles. As a surface code cycle involves a significant classical processing stage, our cost estimates allow for crude, but direct, comparisons of classical and quantum algorithms. We exhibit a circuit for a pre-image attack on SHA-256 that is approximately $2^{153.8}$ surface code cycles deep and requires approximately $2^{12.6}$ logical qubits. This yields an overall cost of $2^{166.4}$ logical-qubit-cycles. Likewise, we exhibit a SHA3-256 circuit that is approximately $2^{146.5}$ surface code cycles deep and requires approximately $2^{20}$ logical qubits, for a total cost of, again, $2^{166.5}$ logical-qubit-cycles. Both attacks require on the order of $2^{128}$ queries in a quantum black-box model, hence our results suggest that executing these attacks may be as much as $275$ billion times more expensive than one would expect from the simple query analysis.
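
    The time-area figures above combine multiplicatively, so the exponents simply add. A quick check of the arithmetic, using only the depths and qubit counts quoted in the abstract (all values are log2):

        # Time-area metric: total cost = circuit depth (surface code cycles) * logical qubits.
        sha256_depth, sha256_qubits = 153.8, 12.6
        sha3_depth,   sha3_qubits   = 146.5, 20.0
        query_bound = 128.0                         # ~2^128 Grover iterations / queries

        sha256_cost = sha256_depth + sha256_qubits  # 166.4
        sha3_cost   = sha3_depth + sha3_qubits      # 166.5

        print(f"SHA-256 cost:  2^{sha256_cost:.1f} logical-qubit-cycles")
        print(f"SHA3-256 cost: 2^{sha3_cost:.1f} logical-qubit-cycles")
        print(f"overhead over the plain query count: ~2^{sha256_cost - query_bound:.1f}")
        # 2^38 is roughly 275 billion, the rounded figure quoted in the abstract.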

    Estimating quantum speedups for lattice sieves

    Quantum variants of lattice sieve algorithms are routinely used to assess the security of lattice based cryptographic constructions. In this work we provide a heuristic, non-asymptotic analysis of the cost of several algorithms for near neighbour search on high dimensional spheres. These algorithms are key components of lattice sieves. We design quantum circuits for near neighbour search algorithms and provide software that numerically optimises algorithm parameters according to various cost metrics. Using this software we estimate the cost of classical and quantum near neighbour search on spheres. For the most performant near neighbour search algorithm that we analyse, we find a small quantum speedup in dimensions of cryptanalytic interest. Achieving this speedup requires several optimistic physical and algorithmic assumptions.

    Decryption failure is more likely after success

    The user of an imperfectly correct lattice-based public-key encryption scheme leaks information about their secret key with each decryption query that they answer---even if they answer all queries successfully. Through a refinement of the D'Anvers--Guo--Johansson--Nilsson--Vercauteren--Verbauwhede failure boosting attack, we show that an adversary can use this information to improve his odds of finding a decryption failure. We also propose a new definition of $\delta$-correctness, and we re-assess the correctness of several submissions to NIST's post-quantum standardization effort.
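
    For background (a general statement, not the refined definition proposed in this paper), the commonly used notion of $\delta$-correctness, due to Hofheinz--Hövelmanns--Kiltz, says that a scheme is $\delta$-correct if $\mathbb{E}_{(pk,\,sk) \leftarrow \mathrm{KeyGen}()}\bigl[\max_{m} \Pr[\mathrm{Dec}(sk, \mathrm{Enc}(pk, m)) \neq m]\bigr] \le \delta$, where the probability is over the encryption randomness; the failure probability is maximized over messages but averaged over key pairs.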

    NTRU Modular Lattice Signature Scheme on CUDA GPUs

    In this work we show how to use Graphics Processing Units (GPUs) with the Compute Unified Device Architecture (CUDA) to accelerate a lattice based signature scheme, namely, the NTRU modular lattice signature (NTRU-MLS) scheme. Lattice based schemes require operations on large vectors that are perfect candidates for GPU implementations. In addition, similar to most lattice based signature schemes, NTRU-MLS provides transcript security with a rejection sampling technique. With a GPU implementation, we are able to generate many candidates simultaneously, and hence mitigate the performance slowdown from rejection sampling. Our implementation results show that for the original NTRU-MLS parameter sets, we obtain a 2x improvement in signing speed; for the revised parameter sets, where the acceptance rate of rejection sampling drops to around 1%, our implementation can be as much as 47x faster than a CPU implementation.
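
    The batching idea is generic: when each candidate signature is accepted with only a small probability, generating candidates in large parallel batches keeps the hardware busy instead of stalling on rejects. Below is a minimal, scheme-agnostic sketch of that pattern in numpy; the acceptance probability, batch size, and candidate shape are illustrative placeholders, not NTRU-MLS values, and the accept test is a stand-in for the real norm/coset condition.

        import numpy as np

        rng = np.random.default_rng(0)
        ACCEPT_PROBABILITY = 0.01   # illustrative: ~1% acceptance, as for the revised parameter sets
        BATCH_SIZE = 4096           # candidate signatures generated per batch
        DIM = 512                   # illustrative candidate vector length

        def generate_candidates(batch_size, dim):
            # Stand-in for the real signing step: each row is one candidate signature vector.
            return rng.integers(-100, 101, size=(batch_size, dim))

        def accept(candidates):
            # Stand-in for the real rejection test; here each candidate is accepted
            # independently with probability ACCEPT_PROBABILITY.
            return rng.random(len(candidates)) < ACCEPT_PROBABILITY

        def sign_batched():
            # Generate whole batches at once; on a GPU each batch maps to many threads.
            while True:
                candidates = generate_candidates(BATCH_SIZE, DIM)
                mask = accept(candidates)
                if mask.any():
                    return candidates[mask][0]   # first accepted candidate in the batch

        signature = sign_batched()
        print("accepted a candidate with shape", signature.shape)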

    Transcript secure signatures based on modular lattices

    We introduce a class of lattice-based digital signature schemes based on modular properties of the coordinates of lattice vectors. We also suggest a method of making such schemes transcript secure via a rejection sampling technique of Lyubashevsky (2009). A particular instantiation of this approach is given, using NTRU lattices. Although the scheme is not supported by a formal security reduction, we present arguments for its security and derive concrete parameters (first version) based on the performance of state-of-the-art lattice reduction and enumeration techniques. In the revision, we re-evaluate the security of the first version of the parameter sets under the hybrid approach that combines a lattice reduction attack with a meet-in-the-middle attack. We present new sets of parameters that are robust against this attack, as well as against all previously known attacks.
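
    As background on the Lyubashevsky-style rejection sampling referenced here (a general description, not the specifics of this scheme): the signer draws a candidate signature $z$ whose natural distribution $g$ depends on the secret key, and outputs it only with probability $\min\bigl(1, f(z) / (M \cdot g(z))\bigr)$ for a fixed, secret-independent target distribution $f$ and a constant $M \ge \max_z f(z)/g(z)$. Accepted signatures are then distributed according to $f$, so a transcript of signatures reveals nothing about the secret key beyond what $f$ itself does; the price is that each attempt succeeds with probability roughly $1/M$, which is what motivates the batched GPU signing described in the previous entry.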